Meta hires Oslo-based “supercomputing” experts to boost AI infrastructure
Meta Platforms Inc has hired an Oslo-based team that until the end of last year built AI network technology for British chip unicorn Graphcore.
A Meta spokesperson confirmed the hiring in response to a request for comment after Reuters identified 10 people whose LinkedIn profiles indicated they worked at Graphcore until December 2022 or January 2023 and later joined Meta in February or March of this year.
“We have recently welcomed a number of highly specialized engineers in Oslo to our infrastructure team at Meta. They bring deep expertise in the design and development of supercomputing systems that support artificial intelligence and machine learning at scale in Meta’s data centers,” said Jon Carvill, Meta’s spokesperson.
The move adds momentum to the social media giant’s efforts to improve how its data centers handle AI work, as it contends with growing demand for AI-oriented infrastructure from teams looking to build new features.
Meta, which owns Facebook and Instagram, has become increasingly reliant on artificial intelligence technology to target advertising, select content for its apps’ feeds and remove banned content from its platforms.
Moreover, it is now racing with rivals such as Microsoft Corp and Alphabet Inc’s Google to release generative artificial intelligence products capable of creating human-like writing, art and other content, which investors see as the next big growth area for tech companies.
Ten employees’ job descriptions on LinkedIn showed the team had worked on AI-specific network technology at Graphcore, which develops computer chips and systems optimized for AI work.
Carvill declined to say what they would work on at Meta.
Graphcore closed its Oslo office as part of a wider restructuring announced in October last year, a spokesman for the startup said, as it struggled to make inroads against US companies such as Nvidia Corp and Advanced Micro Devices Inc, which dominate the market for artificial intelligence chips.
Meta already has its own unit that designs a variety of chips aimed at speeding up and improving the efficiency of its AI work, including a networking chip that performs a kind of air traffic control function for servers, two sources told Reuters.
A powerful network is particularly useful for modern AI systems, such as those behind the chatbot ChatGPT or the image-generation tool DALL-E, which are far too large to fit on a single computing chip and must instead be distributed across multiple interconnected chips.
A new class of network chips has emerged to help keep data moving smoothly across these compute clusters. Nvidia, AMD and Intel Corp all make such networking chips.
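To make the idea concrete, the sketch below is a purely illustrative example (not Meta’s or Graphcore’s actual code) of how a weight matrix too large for one accelerator can be split across several devices using JAX; the array names, shapes and mesh layout are assumptions chosen for the demo.

```python
# Illustrative sketch only (not Meta's or Graphcore's code): splitting one large
# weight matrix across all available accelerators with JAX. On a laptop this
# runs on a single CPU device; on a multi-GPU/TPU host it shards for real.
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

devices = jax.devices()                                   # all chips JAX can see
mesh = Mesh(mesh_utils.create_device_mesh((len(devices),)), axis_names=("model",))

# Hypothetical weight matrix; its columns are spread across the "model" axis.
weights = jnp.ones((8192, 8192), dtype=jnp.float32)
sharded_w = jax.device_put(weights, NamedSharding(mesh, P(None, "model")))

@jax.jit
def forward(x, w):
    # Each chip multiplies against its own slice of the weights; the partial
    # results travel over the chip-to-chip interconnect to form the full output.
    return x @ w

x = jnp.ones((32, 8192), dtype=jnp.float32)
print(forward(x, sharded_w).shape)  # (32, 8192)
```

Once weights are sharded this way, every forward pass moves partial results between chips, which is exactly the traffic the networking chips described above are built to manage.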
In addition to the networking chip, Meta is also planning a more complex computing chip capable of both training AI models and performing inference, the process by which trained models make judgments and respond to prompts, though it does not expect that chip to be ready until around 2025.
Investors such as Microsoft and venture capital firm Sequoia once viewed Graphcore, one of Britain’s most valuable technology companies, as a promising potential challenger to Nvidia’s dominant position in the artificial intelligence systems market.
However, it faced a setback in 2020 when Microsoft scrapped an early deal to buy Graphcore chips for its Azure cloud computing platform, according to a report in Britain’s The Times. Instead, Microsoft used Nvidia GPUs to build the massive infrastructure that supports ChatGPT developer OpenAI, which Microsoft also backs.
Sequoia has since written down its investment in Graphcore to zero, although it remains on the company’s board, according to a source familiar with the relationship. Insider first reported the write-down in October.
A Graphcore spokesperson acknowledged the setbacks but said the company retains “full potential” to capitalize on the commercial deployment of AI.
Graphcore was last valued at $2.8 billion after raising $222 million in its most recent investment round in 2020.